
    Detecting microplastics pollution in world oceans using SAR remote sensing

    Plastic pollution in the world’s oceans is estimated to have reached 270,000 tonnes, or 5.25 trillion pieces. This plastic is now ubiquitous; however, due to ocean circulation patterns, it accumulates in the ocean gyres, creating “garbage patches”. This plastic debris is colonized by microorganisms, which create unique biofilm ecosystems. Microbial colonization is the first step towards disintegration and degradation of plastic materials: a process that releases metabolic by-products from energy synthesis. These by-products include short-chain and more complex carbon molecules in the form of surfactants, which we hypothesize will affect the fluid-dynamic properties of waves (changes in viscosity and surface tension) and make them detectable by SAR sensors. In this study we used Sentinel-1A and COSMO-SkyMed SAR images of selected sites in both the North Pacific and North Atlantic oceans, close to ocean gyres and away from coastal interference. Alongside the SAR processing, we conducted a contextual analysis using ocean geophysical products of sea surface temperature, surface wind, chlorophyll, wave height and the wave spectrum of the ocean surface. In addition, we started experiments under controlled conditions to test the behaviour of microbes colonizing the two most common pollutants, polyethylene (PE) and polyethylene terephthalate (PET) microplastics. The analysis of the SAR images has shown that a combination of surface wind speed and Langmuir-cell ocean circulation patterns is the main controlling factor in creating the distinct appearance of the sea slicks and microbial biofilms. The preliminary conclusion of our study is that SAR remote sensing may be able to detect plastic pollution in the open oceans and that this method can be extended to other areas.

    Growing stock volume estimation in temperate forested areas using a fusion approach with SAR satellite imagery

    Forest monitoring plays a central role in the context of global warming mitigation and in the assessment of forest resources. To meet these challenges, significant efforts have been made by scientists to develop new, feasible remote sensing techniques for the retrieval of forest parameters. However, much work remains to be done in this area, in particular in establishing global assessments of forest biomass. In this context, this Ph.D. thesis presents a complete methodology for estimating Growing Stock Volume (GSV) in temperate forested areas using a fusion approach based on Synthetic Aperture Radar (SAR) satellite imagery. The investigations focused on the Thuringian Forest, located in Central Germany. The satellite data comprise an extensive set of L-band (ALOS PALSAR) and X-band (TerraSAR-X, TanDEM-X, COSMO-SkyMed) images acquired in various sensor configurations (acquisition modes, polarisations, incidence angles). The available ground data consist of a forest inventory delivered by the local forest offices. Weather measurements and a LiDAR DEM complete the datasets. The research showed that the topography, together with the forest structure and weather conditions, generally limited the sensitivity of the SAR signal to GSV. The best correlations were obtained with the ALOS PALSAR (R² = 0.61) and TanDEM-X (R² = 0.72) interferometric coherences. These datasets were chosen for the retrieval of GSV in the Thuringian Forest and, with regressions, led to a root-mean-square error (RMSE) in the range of 100–200 m³ ha⁻¹. As a final achievement of this thesis, a methodology for combining the SAR information was developed. Assuming that sufficient and adequate remote sensing data are available, the proposed fusion approach may increase the accuracy, spatial extent and update frequency of biomass maps. These characteristics are essential for the future derivation of accurate, global and robust forest biomass maps.
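    As a rough illustration of the retrieval step summarized above, the sketch below fits a simple linear regression of GSV against interferometric coherence and reports R² and RMSE; the coherence and inventory values are placeholders, not data from the thesis.

```python
# Illustrative sketch only: regressing growing stock volume (GSV) on
# interferometric coherence and reporting R^2 / RMSE, as in the retrieval
# step summarized above. The sample values below are made up.
import numpy as np

coherence = np.array([0.82, 0.74, 0.65, 0.58, 0.51, 0.47, 0.40, 0.35])  # hypothetical per-stand coherence
gsv = np.array([90.0, 140, 210, 260, 320, 370, 430, 480])               # hypothetical inventory GSV (m^3/ha)

# First-order regression of GSV against coherence
slope, intercept = np.polyfit(coherence, gsv, 1)
gsv_pred = slope * coherence + intercept

# Goodness of fit (R^2) and root-mean-square error (RMSE)
ss_res = np.sum((gsv - gsv_pred) ** 2)
ss_tot = np.sum((gsv - gsv.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
rmse = np.sqrt(np.mean((gsv - gsv_pred) ** 2))
print(f"R2 = {r2:.2f}, RMSE = {rmse:.0f} m3/ha")
```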

    Use of a prospective risk analysis method to improve the safety of the cancer chemotherapy process

    Objective. To perform a risk analysis of the cancer chemotherapy process by comparing five different organizations, to quantitatively demonstrate the usefulness of centralization and of information technologies, and to identify residual risks that may be the target of additional actions. Study design. A re-engineering of the process started in 1999 and was planned to be finished in 2006. The analysis was performed after centralization and at the beginning of the integration of information technologies. Setting. A 2200-bed university hospital with medical, surgical, haematological, gynaecological, geriatric and paediatric oncology departments, performing twelve thousand cancer chemotherapies each year. Methods. According to the failure modes, effects and criticality analysis (FMECA) method, the failure modes were defined and their criticality indexes were calculated on the basis of the likelihood of occurrence, the potential severity for the patients and the detection probability. Criticality indexes were compared and the acceptability of residual risks was evaluated. Results. The sum of the criticality indexes of the 27 identified failure modes was 3596 for the decentralized phase, 2682 for centralization, 2385 for electronic prescription, 2081 for electronic production control, and 1824 for bedside scanning (a 49% global reduction). The greatest improvements concerned the risk of errors in the production protocols (by a factor of 48), followed by readability problems during transmission (14) and product/dose errors during production (8). Among the six criticality indexes remaining above 100 in the final process, two were judged acceptable, whereas further improvements were planned for the four others. Conclusions. Centralization to the pharmacy was associated with a strong improvement, but additional developments involving information technologies also contributed to a major risk reduction. A cost-effectiveness analysis confirmed the pertinence of all developments, as the cost per gained criticality point remained stable across the different phases.
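    For readers unfamiliar with FMECA, the sketch below shows how a criticality index is typically computed and summed over failure modes (occurrence × severity × detectability); the failure modes and scores are illustrative, not those of the study.

```python
# Illustrative FMECA-style criticality calculation (not the study's data):
# criticality index = occurrence x severity x detectability, summed over
# the identified failure modes of a process.
failure_modes = {
    # name: (occurrence likelihood, severity for the patient, detectability score)
    "transcription error in prescription":  (4, 8, 6),
    "wrong dose during production":         (3, 9, 5),
    "patient misidentification at bedside": (2, 9, 7),
}

def criticality(occurrence: int, severity: int, detectability: int) -> int:
    """Criticality index of a single failure mode (higher = riskier)."""
    return occurrence * severity * detectability

for name, scores in failure_modes.items():
    print(f"{name}: {criticality(*scores)}")

total = sum(criticality(*scores) for scores in failure_modes.values())
print(f"Sum of criticality indexes: {total}")
```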

    2011-09-22 Minutes of the Executive Committee of the Academic Senate

    Approved minutes of a meeting of the Executive Committee of the Academic Senate of the University of Dayton.

    Laser Noise Reduction in Air

    Fluctuations of the white-light supercontinuum produced by ultrashort laser pulses in self-guided filaments (spatio-temporal solitons) in air are investigated. We demonstrate that correlations exist within the white-light supercontinuum and that they can be used to significantly reduce the laser intensity noise by filtering the spectrum. More precisely, the fundamental wavelength is anticorrelated with the wings of the continuum, while conjugated wavelength pairs on both sides of the continuum are strongly correlated. Spectral filtering of the continuum reduces the laser intensity noise by 1.2 dB, showing that fluctuations are rejected to the edges of the spectrum.
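    The analysis behind such a measurement can be sketched as follows: from a stack of single-shot spectra, one computes the correlation of each spectral bin with the fundamental wavelength and compares the relative intensity noise of the full spectrum with that of a filtered band. The toy model and numbers below are invented for illustration and do not reproduce the paper's physics or its 1.2 dB figure.

```python
# Illustrative analysis sketch (synthetic data, not the experiment):
# correlate each spectral bin with the fundamental and compare the relative
# intensity noise (RIN) of the full spectrum with that of a filtered band.
import numpy as np

rng = np.random.default_rng(0)
n_shots, n_bins = 2000, 256
wavelength = np.linspace(500, 1100, n_bins)          # nm, synthetic axis

# Toy model: overall pulse-energy noise plus a term that moves energy
# between the fundamental (~800 nm) and the continuum wings.
energy = 1 + 0.05 * rng.standard_normal(n_shots)
transfer = 0.03 * rng.standard_normal(n_shots)
fund = np.exp(-((wavelength - 800) / 40) ** 2)
wings = np.exp(-((wavelength - 600) / 60) ** 2) + np.exp(-((wavelength - 1000) / 60) ** 2)
spectra = np.outer(energy - transfer, fund) + np.outer(0.3 * energy + transfer, wings)

# Correlation of every bin with the fundamental bin (anticorrelation shows up as negative values)
i_fund = int(np.argmin(np.abs(wavelength - 800)))
corr = np.array([np.corrcoef(spectra[:, i], spectra[:, i_fund])[0, 1] for i in range(n_bins)])

def rin_db(power: np.ndarray) -> float:
    """Relative intensity noise of a band-integrated signal, in dB."""
    return 10 * np.log10(power.std() / power.mean())

full_band = spectra.sum(axis=1)
filtered_band = spectra[:, (wavelength > 700) & (wavelength < 900)].sum(axis=1)
print(f"RIN, full spectrum: {rin_db(full_band):.1f} dB")
print(f"RIN, 700-900 nm band: {rin_db(filtered_band):.1f} dB")
print(f"min / max correlation with fundamental: {corr.min():.2f} / {corr.max():.2f}")
```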

    Interactive Learning-Based Realizability for Heyting Arithmetic with EM1

    We apply to the semantics of Arithmetic the idea of “finite approximation” used to provide computational interpretations of Herbrand's Theorem, and we interpret classical proofs as constructive proofs (with constructive rules for ∨, ∃) over a suitable structure N for the language of natural numbers and maps of Gödel's system T. We introduce a new realizability semantics we call “Interactive Learning-Based Realizability”, for Heyting Arithmetic plus EM1 (the excluded middle axiom restricted to Σ⁰₁ formulas). Individuals of N evolve with time, and realizers may “interact” with them, by influencing their evolution. We build our semantics over Avigad's fixed point result, but the same semantics may be defined over different constructive interpretations of classical arithmetic (Berardi and de' Liguoro use continuations). Our notion of realizability extends intuitionistic realizability and differs from it only in the atomic case: we interpret atomic realizers as “learning agents”.

    Does the advertisement in Swiss pharmacy windows rest on evidence-based medicine? An observational study.

    OBJECTIVES The aim of the study was to analyse the proportion of evidence-based medications displayed in pharmacies and to compare it between the different linguistic regions of the country at different times of the year, in order to determine the amount of medications of proven effectiveness indirectly recommended to the public in different parts of Switzerland. DESIGN This is an observational study conducted by medical doctors in the department of internal medicine at the Spitalzentrum Biel, Switzerland. SETTING The observation took place from July 2019 to May 2020. From a total of 1800 pharmacies in Switzerland, 68 different pharmacies were selected across the three main linguistic regions, and the medications on display in their windows were examined four times a year regarding their efficacy. The displays of medications with or without evidence-based efficacy were described using absolute numbers and proportions and compared between the different linguistic regions at different seasons using χ² tests. PARTICIPANTS There were no human or animal participants involved in this study. PRIMARY AND SECONDARY OUTCOME MEASURES The primary outcome is the proportion of medications displayed in pharmacy windows with proven effectiveness in the medical literature. The secondary outcome was the variability of the primary outcome over time (seasonal changes), over the different linguistic regions of Switzerland, and between chain and privately owned pharmacies. RESULTS We examined 970 medications and found that, over the whole year, there is a high proportion of non-evidence-based drugs (56.9%) displayed in pharmacies. Swiss German cantons display significantly more non-evidence-based medications in winter. We found no statistical difference for other seasons or between chain and privately owned pharmacies. CONCLUSION Pharmacies in Switzerland tend to display significantly more non-evidence-based drugs, thus indirectly recommending them to the public. In a time of necessary expansion of self-medication by the population, this could incite consumers to buy drugs without proven effectiveness.
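    As an illustration of the comparison described above, the sketch below runs a χ² test on a contingency table of evidence-based versus non-evidence-based displays per linguistic region; the counts are placeholders, not the study's data.

```python
# Illustrative chi-squared test on a contingency table of displayed
# medications (evidence-based vs non-evidence-based) by linguistic region.
# The counts are made up for the example.
from scipy.stats import chi2_contingency

observed = [
    # columns: evidence-based, non-evidence-based
    [180, 260],   # German-speaking region (hypothetical)
    [140, 160],   # French-speaking region (hypothetical)
    [ 98, 132],   # Italian-speaking region (hypothetical)
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
```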

    Flow analysis from multiparticle azimuthal correlations

    We present a new method for analyzing directed and elliptic flow in heavy-ion collisions. Unlike standard methods, it separates the contribution of flow to azimuthal correlations from contributions due to other effects. The separation relies on a cumulant expansion of multiparticle azimuthal correlations, and includes corrections for detector inefficiencies. This new method allows the measurement of the flow of identified particles in narrow phase-space regions, and can be used in every regime, from intermediate to ultrarelativistic energies.
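    For context, the cumulant expansion underlying this type of analysis is usually written as follows (standard notation, not quoted from the paper itself): the two- and four-particle azimuthal cumulants and the flow estimates derived from them.

```latex
% Standard two- and four-particle azimuthal cumulants (double angle brackets
% denote event-averaged correlators with lower-order terms subtracted):
\[
c_n\{2\} = \bigl\langle\bigl\langle e^{in(\phi_1-\phi_2)} \bigr\rangle\bigr\rangle ,\qquad
c_n\{4\} = \bigl\langle\bigl\langle e^{in(\phi_1+\phi_2-\phi_3-\phi_4)} \bigr\rangle\bigr\rangle
          - 2\,\bigl\langle\bigl\langle e^{in(\phi_1-\phi_2)} \bigr\rangle\bigr\rangle^{2} .
\]
% The corresponding flow estimates; nonflow correlations are increasingly
% suppressed at higher order:
\[
v_n\{2\} = \sqrt{c_n\{2\}} ,\qquad v_n\{4\} = \bigl(-c_n\{4\}\bigr)^{1/4} .
\]
```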